[[File:CSIRO ScienceImage 8130 The computer lab on RV Southern Surveyor.jpg|thumb|right|A computer lab contains a wide range of information technology elements, including hardware, software and storage systems. ]]
Information technology (IT) is the study or use of computers, telecommunication systems and other devices to create, process, store, retrieve and transmit information. While the term is commonly used to refer to computers and computer networks, it also encompasses other information distribution technologies such as television and telephones. Information technology is an application of computer science and computer engineering.
An information technology system (IT system) is generally an information system, a communications system, or, more specifically, a computer system — including all hardware, software, and peripheral equipment — operated by a limited group of IT users, and an IT project usually refers to the commissioning and implementation of an IT system (Forbes Technology Council, "16 Key Steps To Successful IT Project Management", published 10 September 2020, accessed 23 June 2023). IT systems play a vital role in facilitating efficient data management, enhancing communication networks, and supporting organizational processes across various industries. Successful IT projects require meticulous planning and ongoing maintenance to ensure optimal functionality and alignment with organizational objectives.
Although humans have been storing, retrieving, manipulating, analysing and communicating information since the earliest writing systems were developed, the term information technology in its modern sense first appeared in a 1958 article published in the Harvard Business Review; authors Harold Leavitt and Thomas L. Whisler commented that "the new technology does not yet have a single established name. We shall call it information technology (IT)." Their definition consists of three categories: techniques for processing information, the application of statistical and mathematical methods to decision-making, and the simulation of higher-order thinking through computer programs.
Ideas of computer science were first discussed before the 1950s at the Massachusetts Institute of Technology (MIT) and Harvard University, where researchers had begun thinking about computer circuits and numerical calculations. As time went on, the fields of information technology and computer science grew more sophisticated and were able to handle the processing of more data. Scholarly articles began to be published by different organizations.
During the mid-1900s, Alan Turing, J. Presper Eckert, and John Mauchly were some of the pioneers of early computer technology. While their main efforts focused on designing the first digital computer, Turing also began to raise questions about artificial intelligence (Henderson, H. (2017). Computer science. In Facts on File science library: Encyclopedia of computer science and technology (3rd ed.). New York: Facts On File).
Devices have been used to aid computation for thousands of years, probably initially in the form of a tally stick. The Antikythera mechanism, dating from about the beginning of the first century BC, is generally considered the earliest known mechanical analog computer, and the earliest known geared mechanism. Comparable geared devices did not emerge in Europe until the 16th century, and it was not until 1645 that the first mechanical calculator capable of performing the four basic arithmetical operations was developed. Electronic computers, using either relays or vacuum tubes, began to appear in the early 1940s. The electromechanical Zuse Z3, completed in 1941, was the world's first programmable computer, and by modern standards one of the first machines that could be considered a complete computing machine. During the Second World War, Colossus, the first electronic digital computer, was developed to decrypt German messages. Although it was programmable, it was not general-purpose, being designed to perform only a single task. It also lacked the ability to store its program in memory; programming was carried out using plugs and switches to alter the internal wiring. The first recognizably modern electronic digital stored-program computer was the Manchester Baby, which ran its first program on 21 June 1948.
The development of transistors in the late 1940s at Bell Laboratories allowed a new generation of computers to be designed with greatly reduced power consumption. The first commercially available stored-program computer, the Ferranti Mark I, contained 4050 valves and had a power consumption of 25 kilowatts. By comparison, the first transistorized computer, developed at the University of Manchester and operational by November 1953, consumed only 150 watts in its final version.
Several other breakthroughs in semiconductor technology followed: silicon dioxide surface passivation by Carl Frosch and Lincoln Derick in 1955, the first planar silicon dioxide transistors by Frosch and Derick in 1957, the integrated circuit (IC) invented by Jack Kilby at Texas Instruments in 1958 and Robert Noyce at Fairchild Semiconductor in 1959, and the demonstration of the MOSFET by a Bell Labs team.
By 1984, according to the National Westminster Bank Quarterly Review, the term information technology had been redefined as "the convergence of telecommunications and computing technology (...generally known in Britain as information technology)." The term subsequently appeared in 1990 in documents of the International Organization for Standardization (ISO) (Information technology. (2003). In E.D. Reilly, A. Ralston & D. Hemmendinger (Eds.), Encyclopedia of computer science (4th ed.)).
By the twenty-first century, innovations in technology had already revolutionized the world as people gained access to different online services. This drastically changed the workforce: thirty percent of U.S. workers were already employed in this field, and 136.9 million people were personally connected to the Internet, equivalent to 51 million households (Stewart, C.M. (2018). Computers. In S. Bronner (Ed.), Encyclopedia of American studies. Online. Johns Hopkins University Press). Along with the Internet, new types of technology were being introduced across the globe, improving efficiency and making tasks easier.
As technology revolutionized society, millions of processes could be completed in seconds. Innovations in communication were crucial as people increasingly relied on computers to communicate via telephone lines and cable networks. The introduction of email was considered revolutionary as "companies in one part of the world could communicate by e-mail with suppliers and buyers in another part of the world..." (Northrup, C.C. (2013). Computers. In C. Clark Northrup (Ed.), Encyclopedia of world trade: from ancient times to the present. Online. London: Routledge).
Beyond personal use, computers and technology have also revolutionized the marketing industry, resulting in more buyers of products. In 2002, Americans spent more than $28 billion on goods over the Internet alone, while a decade later e-commerce generated $289 billion in sales. As computers rapidly become more sophisticated, people have grown increasingly reliant on them during the twenty-first century.
All database management systems consist of a number of components that together allow the data they store to be accessed simultaneously by many users while maintaining its integrity. A characteristic shared by all databases is that the structure of the data they contain is defined and stored separately from the data itself, in a database schema.
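The separation of schema from data can be illustrated with Python's built-in sqlite3 module. The table and column names below are hypothetical, chosen only for the sketch; the point is that the structure is declared once and stored apart from the rows that conform to it.

```python
import sqlite3

# In-memory database for illustration.
conn = sqlite3.connect(":memory:")

# The schema: the structure of the data, declared independently of any rows.
conn.execute(
    "CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, role TEXT)"
)

# The data: rows conforming to the schema, stored separately from it.
conn.executemany(
    "INSERT INTO employees (name, role) VALUES (?, ?)",
    [("Ada", "engineer"), ("Grace", "analyst")],
)
conn.commit()

# The schema itself can be inspected without touching the data.
schema = conn.execute(
    "SELECT sql FROM sqlite_master WHERE name = 'employees'"
).fetchone()[0]
rows = conn.execute("SELECT name, role FROM employees ORDER BY id").fetchall()
print(schema)
print(rows)  # [('Ada', 'engineer'), ('Grace', 'analyst')]
```

The database engine, not the application, enforces that inserted rows match the declared structure, which is what lets many users read and write the same store concurrently without corrupting it.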
In the late 2000s, the Extensible Markup Language (XML) became a popular format for data representation. Although XML data can be stored in normal file systems, it is commonly held in relational databases to take advantage of their "robust implementation verified by years of both theoretical and practical effort". As an evolution of the Standard Generalized Markup Language (SGML), XML's text-based structure offers the advantage of being both machine-readable and human-readable.
XML has been increasingly employed as a means of data interchange since the early 2000s, particularly for machine-oriented interactions in web-oriented protocols such as SOAP, describing "data-in-transit rather than... data-at-rest".
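The dual machine/human readability described above can be sketched with Python's standard xml.etree.ElementTree parser. The order record below is a hypothetical example, not drawn from any particular protocol: a person can read the markup directly, and a program can extract the same fields.

```python
import xml.etree.ElementTree as ET

# A hypothetical order record: readable by a human as plain text...
document = """
<order id="1042">
  <customer>Acme Ltd</customer>
  <item sku="X-7" quantity="3"/>
</order>
"""

# ...and parseable by a machine into a navigable tree.
root = ET.fromstring(document)
print(root.get("id"))                     # 1042
print(root.find("customer").text)         # Acme Ltd
print(root.find("item").get("quantity"))  # 3
```

In an interchange setting such as SOAP, a document like this would be serialized, sent over the wire, and parsed the same way on the receiving end, which is why XML is described as modelling data-in-transit.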
Massive amounts of data are stored worldwide every day, but unless it can be analyzed and presented effectively it essentially resides in what have been called data tombs: "data archives that are seldom visited". To address that issue, the field of data mining — "the process of discovering interesting patterns and knowledge from large amounts of data" — emerged in the late 1980s.
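A minimal sketch of the pattern-discovery idea behind data mining is frequent itemset counting: scanning many records for combinations of items that co-occur often. The transactions and threshold below are toy values invented for illustration.

```python
from collections import Counter
from itertools import combinations

# Hypothetical transaction records (e.g. shopping baskets).
transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
    {"milk", "butter"},
]

# Count every pair of items that occurs together in a transaction.
pair_counts = Counter()
for basket in transactions:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# Pairs meeting a minimum support threshold are the "interesting patterns".
min_support = 2
frequent = {pair: n for pair, n in pair_counts.items() if n >= min_support}
print(frequent)
```

Production data-mining systems use far more efficient algorithms over far larger archives, but the goal is the same: surfacing recurring patterns that would otherwise stay buried in "data tombs".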
The disadvantages of e-mail include: the existence of spam (mass advertising and viral mailings); the theoretical impossibility of guaranteeing delivery of a particular letter; possible delays in message delivery (up to several days); and limits on the size of a single message and on the total size of messages in a mailbox (set per user).
Many companies now have IT departments for managing the computers, networks, and other technical areas of their businesses. Companies have also sought to integrate IT with business outcomes and decision-making through a BizOps or business operations department.
In a business context, the Information Technology Association of America has defined information technology as "the study, design, development, application, implementation, support, or management of computer-based information systems". The responsibilities of those working in the field include network administration, software development and installation, and the planning and management of an organization's technology life cycle, by which hardware and software are maintained, upgraded, and replaced.